A new gradient method via least change secant update

Authors

  • Wah June Leong
  • Malik Abu Hassan
Abstract

The Barzilai–Borwein (BB) gradient method compares favourably with the classical steepest descent method, both in theory and in practical computation. Rather than following a set of line-search rules to ensure convergence, the method takes a 'fixed' step size. Along this line, we present a new two-point approximation to the quasi-Newton equation within the BB framework, based on a well-known least-change result for the Davidon–Fletcher–Powell update, and we propose a new gradient method that belongs to the same class as the BB gradient method, in which the line-search procedure is replaced by a fixed step size. Preliminary numerical results suggest that improvements have been achieved.
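For readers unfamiliar with the BB scheme, the 'fixed' step comes from a two-point secant approximation: with s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, the BB1 choice is alpha_k = (s_{k-1}^T s_{k-1}) / (s_{k-1}^T y_{k-1}). The following is a minimal sketch of the classical BB1 iteration only, not the new method proposed in the paper; the function name bb_gradient, the tolerance, and the fallback safeguard are illustrative assumptions of ours.

import numpy as np

def bb_gradient(grad, x0, alpha0=1e-4, tol=1e-8, max_iter=500):
    """Classical Barzilai-Borwein (BB1) gradient iteration.

    grad   : callable returning the gradient of the objective at x
    alpha0 : fallback step used before BB information exists
    The step size is 'fixed' at each iteration (no line search).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g          # take the fixed BB step
        g_new = grad(x_new)
        s = x_new - x                  # iterate difference  s_k
        y = g_new - g                  # gradient difference y_k
        sy = float(s @ y)
        # BB1 step: alpha = s's / s'y; fall back if curvature is not positive
        alpha = float(s @ s) / sy if sy > 0.0 else alpha0
        x, g = x_new, g_new
    return x

# Example: minimise the convex quadratic f(x) = 0.5 * x'Ax
A = np.diag([1.0, 10.0, 100.0])
x_star = bb_gradient(lambda x: A @ x, x0=np.ones(3))

The safeguard reverts to alpha0 whenever s'y <= 0, since the BB formula is only meaningful under positive curvature along the step.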

Related articles

The modified BFGS method with new secant relation for unconstrained optimization problems

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Based on this modified secant relation, we then present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, while the usual secant relation uses only gradient values. U...
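For context, one well-known Taylor-based modified secant relation of this kind (proposed by Zhang, Deng and Chen) corrects the right-hand side of the usual condition B_{k+1} s_k = y_k with a term built from function values; the display below sketches that standard relation, which may differ in detail from the one derived in this paper.

\[
  B_{k+1} s_k = \tilde{y}_k, \qquad
  \tilde{y}_k = y_k + \frac{\theta_k}{s_k^{\top} s_k}\, s_k, \qquad
  \theta_k = 6\bigl(f(x_k) - f(x_{k+1})\bigr) + 3\bigl(g_k + g_{k+1}\bigr)^{\top} s_k,
\]
where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.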

On the Relation between Two Local Convergence Theories of Least-change Secant Update Methods

In this paper, we show that the main results of the local convergence theory for least-change secant update methods of Dennis and Walker (SIAM J. Numer. Anal. 18 (1981), 949-987) can be proved using the theory introduced recently by Martinez (Math. Comp. 55 (1990), 143-167). In addition, we exhibit two generalizations of well-known methods whose local convergence can be easily proved using Mart...

Local Convergence Theory of Inexact Newton Methods Based on Structured Least Change Updates

In this paper we introduce a local convergence theory for Least Change Secant Update (LCSU) methods. This theory includes most known methods of this class, as well as some new interesting quasi-Newton methods. Further, we prove that this class of LCSU updates may be used to generate iterative linear methods to solve the Newton linear equation in the Inexact-Newton context. Convergence at a q-superlin...

Improved Hessian approximation with modified secant equations for symmetric rank-one method

Symmetric rank-one (SR1) is one of the competitive formulas among quasi-Newton (QN) methods. In this paper, we propose some modified SR1 updates based on modified secant equations, which use both gradient and function information. Furthermore, to avoid the loss of positive definiteness and zero denominators in the new SR1 updates, we apply a restart procedure to this update. Three new a...
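For reference, the classical SR1 update reads as below; modified variants of the kind described in this entry replace y_k with a corrected difference vector built from function values and restart when the denominator is close to zero. The display shows only the standard update, not the paper's new formulas.

\[
  B_{k+1} = B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
  \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k .
\]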

A secant-based Nesterov method for convex functions

A simple secant-based fast gradient method is developed for problems whose objective function is convex and well-defined. The proposed algorithm extends the classical Nesterov gradient method by updating the estimate-sequence parameter with secant information whenever possible. This is achieved by imposing a secant condition on the choice of search point. Furthermore, the proposed algorithm emb...
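As background, the classical Nesterov scheme being extended can be written as follows, where L is a Lipschitz constant of the gradient; the secant-based variant described above modifies how the search point y_k is chosen, which is not shown here.

\[
  x_{k+1} = y_k - \tfrac{1}{L}\,\nabla f(y_k), \qquad
  t_{k+1} = \frac{1 + \sqrt{1 + 4 t_k^{2}}}{2},
\]
\[
  y_{k+1} = x_{k+1} + \frac{t_k - 1}{t_{k+1}}\,(x_{k+1} - x_k), \qquad t_0 = 1 .
\]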


Journal:
  • Int. J. Comput. Math.

Volume: 88, Issue: -

Pages: -

Publication date: 2011